Matrix Completion has No Spurious Local Minimum

Authors

  • Rong Ge
  • Jason D. Lee
  • Tengyu Ma
Abstract

Matrix completion is a basic machine learning problem that has wide applications, especially in collaborative filtering and recommender systems. Simple non-convex optimization algorithms are popular and effective in practice. Despite recent progress in proving various non-convex algorithms converge from a good initial point, it remains unclear why random or arbitrary initialization suffices in practice. We prove that the commonly used non-convex objective function for positive semidefinite matrix completion has no spurious local minima – all local minima must also be global. Therefore, many popular optimization algorithms such as (stochastic) gradient descent can provably solve positive semidefinite matrix completion with arbitrary initialization in polynomial time. The result can be generalized to the setting when the observed entries contain noise. We believe that our main proof strategy can be useful for understanding geometric properties of other statistical problems involving partial or noisy observations.
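As a rough illustration of the claim above (a sketch under simple assumptions, not the authors' implementation), the snippet below runs plain gradient descent from a random initialization on the standard non-convex objective f(X) = Σ_{(i,j)∈Ω} ((XXᵀ)_ij − M_ij)². The function name complete_psd, the step size, and the toy rank-1 problem are illustrative choices.

```python
# Minimal sketch (not the authors' code): gradient descent on the non-convex
# PSD matrix completion objective, started from an arbitrary random point.
import numpy as np

def complete_psd(M_obs, mask, rank, lr=0.05, steps=5000, seed=0):
    """M_obs holds the observed entries (zeros elsewhere); mask is 1 where observed."""
    rng = np.random.default_rng(seed)
    d = M_obs.shape[0]
    X = rng.normal(scale=0.1, size=(d, rank))   # arbitrary initialization
    for _ in range(steps):
        R = mask * (X @ X.T - M_obs)            # residual on observed entries only
        X -= lr * 2.0 * (R + R.T) @ X           # gradient of the squared loss
    return X @ X.T

# Toy usage: a random rank-1 PSD ground truth with roughly half the entries observed.
rng = np.random.default_rng(1)
u = rng.normal(size=(20, 1))
u /= np.linalg.norm(u)
M = u @ u.T
mask = (rng.random((20, 20)) < 0.5).astype(float)
mask = np.maximum(mask, mask.T)                 # keep the observation pattern symmetric
M_hat = complete_psd(M * mask, mask, rank=1)
print("relative error:", np.linalg.norm(M_hat - M) / np.linalg.norm(M))
```

On a toy instance like this the relative error typically becomes small despite the arbitrary start, consistent with the no-spurious-local-minima picture; the step size and iteration count would need tuning for larger problems.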

Related articles

Memory-efficient Kernel PCA via Partial Matrix Sampling and Nonconvex Optimization: a Model-free Analysis of Local Minima

Kernel PCA is a widely used nonlinear dimension reduction technique in machine learning, but storing the kernel matrix is notoriously challenging when the sample size is large. Inspired by [YPCC16], where the idea of partial matrix sampling followed by nonconvex optimization is proposed for matrix completion and robust PCA, we apply a similar approach to memory-efficient Kernel PCA. In theory, w...

No Spurious Local Minima in Nonconvex Low Rank Problems: A Unified Geometric Analysis

In this paper we develop a new framework that captures the common landscape underlying non-convex low-rank matrix problems, including matrix sensing, matrix completion, and robust PCA. In particular, we show for all of the above problems (including asymmetric cases): 1) all local minima are also globally optimal; 2) no high-order saddle points exist. These results explain why simple algorithm...

Graph Matrix Completion in Presence of Outliers

The matrix completion problem has gathered a lot of attention in recent years. In the matrix completion problem, the goal is to recover a low-rank matrix from a subset of its entries. Graph matrix completion was introduced based on the fact that the relations between rows (or columns) of a matrix can be modeled as a graph structure. The graph matrix completion problem is formulated by adding the...

Gradient Descent Learns One-hidden-layer CNN: Don't be Afraid of Spurious Local Minima

We consider the problem of learning a one-hidden-layer neural network with a non-overlapping convolutional layer and ReLU activation function, i.e., f(Z; w, a) = Σ_j a_j σ(wᵀ Z_j), in which both the convolutional weights w and the output weights a are parameters to be learned. We prove that with Gaussian input Z, there is a spurious local minimum that is not a global minimum. Surprisingly, in the pres...
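For concreteness, a small illustrative forward pass for the model described in that snippet, f(Z; w, a) = Σ_j a_j σ(wᵀ Z_j) with σ = ReLU, might look as follows; the patch layout and dimensions are made up for the example and are not taken from that paper.

```python
# Illustrative only: the non-overlapping one-hidden-layer CNN from the snippet,
# f(Z; w, a) = sum_j a_j * ReLU(w^T Z_j), where Z_j is the j-th (disjoint) patch.
import numpy as np

def cnn_forward(Z, w, a):
    """Z: (num_patches, patch_dim) non-overlapping patches; w: shared filter; a: output weights."""
    hidden = np.maximum(Z @ w, 0.0)   # ReLU(w^T Z_j) for every patch j
    return a @ hidden                 # weighted sum over patches

# Tiny usage example with random patches and weights.
rng = np.random.default_rng(0)
Z = rng.normal(size=(4, 8))           # 4 non-overlapping patches of dimension 8
w = rng.normal(size=8)
a = rng.normal(size=4)
print(cnn_forward(Z, w, a))
```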

Improving the first-order precise integration method for dynamic analysis of structures via state-matrix inversion

For solving the dynamic equilibrium equation of structures, several second-order numerical methods have so far been proposed. In these algorithms, conditional stability, period elongation, amplitude error, the appearance of spurious frequencies, and the dependency of the algorithms on the time step are the crucial problems. Among the numerical methods, the Newmark average acceleration algorithm, regardl...

Journal:

Volume   Issue

Pages  -

Publication date: 2016